On the Implementation of Programming Languages with Neural Nets
Authors
Abstract
In this paper we show that programming languages are implementable on neural nets; that is, neural nets can be designed to solve any (computable) high-level programming task. Constructions like the one that follows can also be used to build large-scale neural nets that integrate learning and control structures. We use a very simple model of analog recurrent neural nets and a number-theoretic approach. Our results can be generalised to other models of neural computation, and also to support structured data types (through coding techniques). The use of such a model for computability analysis is due to Hava Siegelmann. In [Siegelmann and Sontag 92, Siegelmann and Sontag 95] Hava Siegelmann and Eduardo Sontag used it to establish lower bounds on the computational power of analog recurrent neural nets.

An analog recurrent neural net is a dynamic system with an application map of the form

$$\vec{x}(t+1) = \phi\big(\vec{x}(t), \vec{u}(t)\big) \qquad (1)$$

where $x_i(t)$ denotes the activity (firing frequency) of neuron $i$ at time $t$ within a population of $N$ interconnected neurons, and $u_i(t)$ the input bit of input stream $i$ at time $t$ within a set of $M$ input channels. The application map $\phi$ is taken as the composition of an affine map with a piecewise-linear map of the interval [0,1], known as the saturated sigmoid:

$$\sigma(x) = \begin{cases} 0 & \text{if } x < 0 \\ x & \text{if } 0 \le x \le 1 \\ 1 & \text{if } x > 1 \end{cases} \qquad (2)$$
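To make the model concrete, here is a minimal Python sketch of equations (1) and (2): the state is updated by an affine map followed by the componentwise saturated sigmoid. The names A, B and c (recurrent weights, input weights and bias) are illustrative assumptions, since the abstract only states that φ composes an affine map with σ.

```python
import numpy as np

def saturated_sigmoid(x):
    """Piecewise-linear saturation sigma of equation (2): clamp each entry to [0, 1]."""
    return np.clip(x, 0.0, 1.0)

def step(x, u, A, B, c):
    """One update of the analog recurrent net, equation (1):
    x(t+1) = sigma(A x(t) + B u(t) + c).
    A (N x N recurrent weights), B (N x M input weights) and c (bias) are
    hypothetical names for the affine part of the map phi."""
    return saturated_sigmoid(A @ x + B @ u + c)

# Tiny usage example with N = 3 neurons and M = 2 input channels (arbitrary weights).
rng = np.random.default_rng(0)
A = rng.uniform(-1, 1, size=(3, 3))
B = rng.uniform(-1, 1, size=(3, 2))
c = rng.uniform(-1, 1, size=3)
x = np.zeros(3)            # initial activations
u = np.array([1.0, 0.0])   # input bits on the two channels at time t
print(step(x, u, A, B, c))
```

In the paper's number-theoretic constructions the weights are rational; the random weights above are only for demonstrating the update rule, not a faithful encoding of a program.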
Similar resources
Solving Fuzzy Equations Using Neural Nets with a New Learning Algorithm
Artificial neural networks have advantages such as learning, adaptation, fault tolerance, parallelism and generalization. This paper offers a novel method for finding a solution of a fuzzy equation that is assumed to have a real solution. To this end, we applied an architecture of fuzzy neural networks in which the connection weights are real numbers. The ...
متن کاملPrediction of Gain in LD-CELP Using Hybrid Genetic/PSO-Neural Models
In this paper, the gain in the LD-CELP speech coding algorithm is predicted using three neural models, which are equipped with genetic and particle swarm optimization (PSO) algorithms to optimize the structure and parameters of the neural networks. Elman, multi-layer perceptron (MLP) and fuzzy ARTMAP are the candidate neural models. The optimized number of nodes in the first and second hidden layers of El...
Symbolic Processing in Neural Networks
Abstract. In this paper we show that programming languages can be translated onto recurrent (analog, rational-weighted) neural nets. The implementation of programming languages in neural nets turns out to be not only theoretically exciting, but also to have practical implications for the recent efforts to merge symbolic and subsymbolic computation. To be of some use, it should be carried out in a context of bo...
Publication date: 2000